On Rényi Divergence Measures for Continuous Alphabet Sources

Authors

  • Manuel Gil
  • Fady Alajaji
Abstract

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. While the closely related concept of Rényi entropy of a probability distribution has been studied extensively, and closed-form expressions for the most common univariate and multivariate continuous distributions have been obtained and compiled [57, 45, 62], the literature currently lacks the corresponding compilation for continuous Rényi divergences. The present thesis addresses this issue for the analytically tractable cases. Closed-form expressions for Kullback-Leibler divergences are also derived and compiled, as they can be seen as an extension by continuity of the Rényi divergences. Additionally, we establish a connection between Rényi divergence and the variance of the log-likelihood ratio of two distributions, which extends the work of Song [57] on the relation between Rényi entropy and the log-likelihood function, and which becomes practically useful in light of the Rényi divergence expressions we have derived. Lastly, we consider the Rényi divergence rate between two stationary Gaussian processes.
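For orientation, the quantities discussed above can be sketched as follows (standard definitions; the precise form of the variance relation established in the thesis may differ in its constants). For densities p and q and order \alpha > 0, \alpha \neq 1,

    D_\alpha(p \| q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha} q(x)^{1-\alpha} \, dx,

    \lim_{\alpha \to 1} D_\alpha(p \| q) = D_{\mathrm{KL}}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,

and a first-order expansion around \alpha = 1 gives the kind of log-likelihood-ratio connection mentioned above:

    D_\alpha(p \| q) \approx D_{\mathrm{KL}}(p \| q) + \frac{\alpha - 1}{2} \, \mathrm{Var}_p\!\left[ \log \frac{p(X)}{q(X)} \right].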


Similar Resources

Universal Estimation of Information Measures for Analog Sources

This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, nearest-neighbor algorithms as well as other approaches are reviewed, with particular ...
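As a concrete illustration of the nearest-neighbor approach mentioned above, the following is a minimal sketch of a k-nearest-neighbor divergence estimator in the spirit of Wang, Kulkarni and Verdú; the function name and the use of SciPy's cKDTree are our own choices, and this is one representative of the family of estimators reviewed, not the monograph's exact algorithm.

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_estimate(x, y, k=1):
        # Sketch of a k-NN estimate of D(p || q) from samples x ~ p and
        # y ~ q; hypothetical helper, not code from the monograph.
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        if x.ndim == 1:
            x = x[:, None]
        if y.ndim == 1:
            y = y[:, None]
        n, d = x.shape
        m = y.shape[0]
        # Distance from each x_i to its k-th nearest neighbour among the
        # remaining x's (query k + 1 because the closest point is x_i itself).
        rho = cKDTree(x).query(x, k + 1)[0][:, -1]
        # Distance from each x_i to its k-th nearest neighbour among the y's.
        nu = cKDTree(y).query(x, k)[0]
        if k > 1:
            nu = nu[:, -1]
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

For two Gaussian samples, such an estimate can be checked against the corresponding closed-form Kullback-Leibler expression.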


Divergence Measures for DNA Segmentation

Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this study, we use the Jensen-Shannon and Jensen-Rényi divergence measures for DNA segmentation. Based on these information theoretic measures and protein shape coded in DNA, we propose a new approach to the problem of finding the borders between coding and noncoding DNA regions....
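For reference, the Jensen-Shannon divergence between two distributions p and q can be written, in its standard equal-weight form (the weighting used in the paper may differ), as

    \mathrm{JS}(p, q) = H\!\left(\tfrac{p + q}{2}\right) - \tfrac{1}{2}\big(H(p) + H(q)\big),

where H denotes the Shannon entropy; the Jensen-Rényi divergence is obtained by replacing H with the Rényi entropy H_\alpha.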


A note on decision making in medical investigations using new divergence measures for intuitionistic fuzzy sets

Srivastava and Maheshwari (Iranian Journal of Fuzzy Systems 13(1) (2016) 25-44) introduced a new divergence measure for intuitionistic fuzzy sets (IFSs). The properties of the proposed divergence measure were studied and the efficiency of the proposed divergence measure in the context of medical diagnosis was also demonstrated. In this note, we point out some errors in ...


Arimoto Channel Coding Converse and Rényi Divergence

Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel. In this paper we present a simple derivation of the Arimoto converse based on the data-processing inequality for Rényi divergence. The method has two benefits. First, it generalizes to codes with feedback and gives the simplest proof of the stro...
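The step referred to above rests on the data-processing inequality for Rényi divergence; in its standard form (a sketch of the general statement, not necessarily the paper's exact formulation), if P_Y and Q_Y are obtained by passing P_X and Q_X through the same channel W, then

    D_\alpha(P_Y \| Q_Y) \le D_\alpha(P_X \| Q_X) \qquad \text{for all } \alpha \in [0, \infty].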


Image Registration and Segmentation by Maximizing the Jensen-Rényi Divergence

Information theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence which is defined between any arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
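For concreteness, the Jensen-Rényi divergence among n distributions p_1, ..., p_n with weights \omega_i \ge 0, \sum_i \omega_i = 1, is commonly written as (a standard form; the paper's weighting and normalization may differ)

    \mathrm{JR}_\alpha^{\omega}(p_1, \ldots, p_n) = H_\alpha\!\Big( \sum_{i=1}^{n} \omega_i p_i \Big) - \sum_{i=1}^{n} \omega_i H_\alpha(p_i),

where H_\alpha denotes the Rényi entropy of order \alpha; majorization arguments of the kind used in the paper bound how large this quantity can be.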




Publication date: 2011